P-Sufficient Statistics for PAC Learning k-term-DNF Formulas through Enumeration

Authors

  • Bruno Apolloni
  • Claudio Gentile
Abstract

Working in the framework of PAC-learning theory, we present special statistics for accomplishing, in polynomial time, proper learning of DNF Boolean formulas having a fixed number of monomials. Our statistics turn out to be near sufficient for a large family of distribution laws, which we call butterfly distributions. We develop a theory of most powerful learning for analyzing the performance of learning algorithms, with particular reference to trade-offs between power and computational cost. Focusing on sample and time complexity, we prove that our algorithm works as efficiently as the best algorithms in the literature, whereas the latter handle only subclasses of our family of distributions. Abbreviated title: P-sufficient statistics for learning k-term DNF.
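For orientation (a standard background bound, not a result taken from this paper): a k-term DNF over n Boolean variables has at most 3^{nk} syntactically distinct forms, since each of the k terms assigns every variable one of three roles (positive literal, negated literal, absent). By the usual consistency (Occam) argument for a finite hypothesis class, any k-term DNF consistent with

    m >= (1/ε) * ( nk·ln 3 + ln(1/δ) )

i.i.d. labelled examples has error at most ε with probability at least 1 − δ. The statistical requirement is therefore modest; the difficulty lies in producing a consistent proper k-term-DNF hypothesis in polynomial time, which is the role of the P-sufficient statistics developed here for butterfly distributions.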


Similar articles

On Learning versus Refutation

Building on the work of Daniely et al. (STOC 2014, COLT 2016), we study the connection between computationally efficient PAC learning and refutation of constraint satisfaction problems. Specifically, we prove that for every concept class P, PAC-learning P is polynomially equivalent to “random-right-hand-side-refuting” (“RRHS-refuting”) a dual class P∗, where RRHS-refutation of a class Q refers ...


A Query Algorithm for Agnostically Learning DNF?

Motivation: One of the most celebrated results in computational learning theory is Jackson’s query algorithm for PAC learning DNF formulas with respect to the uniform distribution [3]. A natural question is whether DNF formulas can be learned (even with queries and with respect to the uniform distribution) in a highly noisy setting, i.e., the well-known agnostic framework of learning [5]. Additi...
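For reference (the standard agnostic-learning guarantee, stated here as background rather than as this entry's result): given a concept class C and an arbitrary joint distribution D over examples and labels, an agnostic learner must, with probability at least 1 − δ, output a hypothesis h satisfying

    err_D(h) <= min_{c in C} err_D(c) + ε,

that is, it must compete with the best concept in C even when no concept in C labels the data perfectly; this is the noisy setting referred to above.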


PAC Meditation on Boolean Formulas

We present a Probably Approximately Correct (PAC) learning paradigm for Boolean formulas, which we call PAC meditation, in which the class of formulas to be learned is not known in advance. Instead, we split the building of the hypothesis into various levels of increasing description complexity according to additional constraints received at run time. In particular, starting from atomic forms c...


PAC Learning with Irrelevant Attributes

We consider the problem of learning in the presence of irrelevant attributes in Valiant's PAC model [V84]. In the PAC model, the goal of the learner is to produce an approximately correct hypothesis from random sample data. If the number of relevant attributes in the target function is small, it may be desirable to produce a hypothesis that also depends on only a small number of variables. Hauss...



Journal:
  • Theor. Comput. Sci.

Volume 230, Issue: -

Pages: -

Publication date: 2000